Differential entropy
Differential entropy (also referred to as continuous entropy) is a concept in information theory that extends the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions.
==Definition==
Let ''X'' be a random variable with a probability density function ''f'' whose support is a set \mathcal{X}. The ''differential entropy'' ''h''(''X''), also written ''h''(''f''), is defined as
:h(X) = -\int_{\mathcal{X}} f(x)\log f(x)\,dx.
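As a quick numerical illustration (my own sketch, not part of the article), the following evaluates the defining integral for a standard normal density and compares it with the known closed form h = ½ log(2πe); the finite integration limits are an assumption chosen to avoid log(0) underflow in the tails.

```python
# A minimal sketch (not from the article) assuming NumPy and SciPy:
# evaluate the defining integral for a standard normal density and
# compare with the known closed form h = 0.5*log(2*pi*e), in nats.
import numpy as np
from scipy.integrate import quad

def f(x):
    # standard normal density
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# finite limits chosen (an assumption) to avoid log(0) underflow in the tails
h_numeric, _ = quad(lambda x: -f(x) * np.log(f(x)), -20, 20)
h_closed = 0.5 * np.log(2 * np.pi * np.e)

print(h_numeric, h_closed)  # both ~= 1.4189 nats
```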
For a probability distribution that lacks an explicit density function but has an explicit quantile function ''Q''(''p''), ''h''(''Q'') can be defined in terms of the derivative of ''Q''(''p''), i.e. the quantile density function ''Q''′(''p''), as
:h(Q) = \int_0^1 \log Q'(p)\,dp.
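To make this concrete (my own example, not from the article): an Exponential(rate) distribution has quantile function Q(p) = −log(1 − p)/rate, hence quantile density Q′(p) = 1/(rate·(1 − p)), and the integral above recovers the known value h = 1 − log(rate) nats.

```python
# A minimal sketch (my own example, not from the article) assuming NumPy
# and SciPy: for Exponential(rate), Q(p) = -log(1 - p)/rate, so the
# quantile density is Q'(p) = 1/(rate*(1 - p)) and h = 1 - log(rate) nats.
import numpy as np
from scipy.integrate import quad

rate = 2.0
log_q_density = lambda p: -np.log(rate * (1.0 - p))  # log Q'(p)

h_quantile, _ = quad(log_q_density, 0.0, 1.0)
print(h_quantile, 1.0 - np.log(rate))  # both ~= 0.3069 nats
```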
As with its discrete analog, the units of differential entropy depend on the base of the logarithm, which is usually 2 (i.e., the units are bits). See logarithmic units for logarithms taken in different bases. Related concepts such as joint differential entropy, conditional differential entropy, and relative entropy are defined in a similar fashion. Unlike the discrete analog, differential entropy has an offset that depends on the units used to measure ''X''.〔pp. 183–184〕 For example, the differential entropy of a quantity measured in millimeters is log(1000) more than that of the same quantity measured in meters; a dimensionless quantity has differential entropy log(1000) more than the same quantity divided by 1000. A worked Gaussian example follows below.
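As a sketch of the unit-dependent offset (my own Gaussian example, not from the article): the closed form h(N(0, σ²)) = ½ log(2πeσ²) makes the offset explicit, since rescaling a quantity by 1000 adds exactly log(1000).

```python
# A minimal sketch (my own Gaussian example, not from the article):
# h(N(0, sigma^2)) = 0.5*log(2*pi*e*sigma^2), so switching units from
# meters to millimeters (sigma -> 1000*sigma) adds exactly log(1000) nats.
import numpy as np

def gaussian_entropy(sigma):
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

h_m  = gaussian_entropy(1.0)     # quantity measured in meters
h_mm = gaussian_entropy(1000.0)  # same quantity measured in millimeters

print(h_mm - h_m, np.log(1000))  # both ~= 6.9078
```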
One must take care in applying properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, the uniform distribution Uniform(0, 1/2) has density ''f''(''x'') = 2 on its support, so its differential entropy is ''negative'':
:h(X) = \int_0^{1/2} -2\log(2)\,dx = -\log(2)\,.
Thus, differential entropy does not share all properties of discrete entropy.
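For a quick check (my own sketch, assuming SciPy is available): scipy.stats distributions expose differential entropy directly via entropy(), and for Uniform(0, 1/2) it returns the negative value −log 2.

```python
# A quick check (my own sketch, assuming SciPy): rv_continuous.entropy()
# returns differential entropy in nats; for Uniform(0, 1/2) it is -log(2).
import numpy as np
from scipy.stats import uniform

print(uniform(loc=0.0, scale=0.5).entropy(), -np.log(2))  # both ~= -0.6931
```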
Note that the continuous mutual information ''I''(''X'';''Y'') retains its fundamental significance as a measure of discrete information, since it is the limit of the discrete mutual information of ''partitions'' of ''X'' and ''Y'' as these partitions become finer and finer. It is therefore invariant under non-linear homeomorphisms (continuous and uniquely invertible maps), including linear transformations of ''X'' and ''Y'', and still represents the amount of discrete information that can be transmitted over a channel that admits a continuous space of values.
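The following sketch (my own illustration, not from the article) estimates I(X;Y) by the plug-in discrete mutual information of a quantile-based partition; because empirical quantile bin edges follow any monotone increasing map, the joint counts, and hence the estimate, are unchanged when X and Y are pushed through the homeomorphism x ↦ x³.

```python
# A minimal sketch (my own illustration, not from the article): estimate
# I(X;Y) by the plug-in discrete mutual information of a quantile-based
# partition.  Quantile bin edges follow any monotone increasing map, so
# the joint counts -- and the estimate -- are unchanged under x -> x**3.
import numpy as np

def mi_from_partition(x, y, bins=30):
    # partition each axis at empirical quantiles
    xe = np.quantile(x, np.linspace(0, 1, bins + 1))
    ye = np.quantile(y, np.linspace(0, 1, bins + 1))
    joint, _, _ = np.histogram2d(x, y, bins=[xe, ye])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask]))

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x + rng.normal(size=100_000)          # true I(X;Y) = 0.5*log(2) ~= 0.3466

print(mi_from_partition(x, y))            # close to the true value
print(mi_from_partition(x**3, y**3))      # same estimate after the transform
```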

Source: excerpt from the free encyclopedia Wikipedia, article "differential entropy".